Hi, welcome to the second tutorial session, the virtual one.
And we're going to start with exercise number four, I think, which asked us to analyze a linear inverse problem of the following type: three equations, x1 + x2 = 1, x1 + x2 = 2, and x1 + x2 = 3, so the specific numerical values are one, two, and three.
So what's happening is that we're trying to infer x1 and x2.
So those are our unknown parameters.
And we don't get the chance to measure those parameters individually, but we get three
different supposedly noisy measurements of their sum.
So the idea is to recover the parameters from this very jumbled and, as you can see, contradictory data: x1 plus x2 cannot be equal to one, two, and three at the same time.
And so this is definitely an ill-posed inverse problem.
There is no solution to this linear system at all, which means that Hadamard's first condition, the existence of a solution, is violated, and that is exactly what makes it ill-posed.
The idea, of course, will be to use the tools we have talked about.
And the first step is to compute the singular value decomposition of the matrix defining this forward problem.
So we're going to write this as y = A x, where A is the 3-by-2 matrix whose entries are all ones, so each of the three rows is (1, 1). Multiplying this matrix with the vector (x1, x2) gives us the data.
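To make this concrete, here is a minimal NumPy sketch (the variable names are my own, not from the exercise sheet) that sets up the forward model and confirms that no exact solution exists:

```python
import numpy as np

# Forward matrix A: three rows, each one measuring the sum x1 + x2
A = np.ones((3, 2))

# The three contradictory measurements from the exercise
y = np.array([1.0, 2.0, 3.0])

# Minimum-norm least-squares solution; since the data are inconsistent,
# the residual cannot be driven to zero
x_ls = np.linalg.lstsq(A, y, rcond=None)[0]
print(x_ls)                         # [1. 1.], i.e. x1 + x2 = 2
print(np.sum((y - A @ x_ls) ** 2))  # 2.0 > 0, so no exact solution exists
```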
How do we get the singular value decomposition of a?
Let us instead consider B = A transposed, which is the 2-by-3 matrix of all ones.
Why?
Because the algorithm I showed you in the lecture is set up for matrices that have at most as many rows as columns, so there might be more columns, but not more rows. That's the point here.
You could set up the same thing for matrices with more rows than columns, but then you would have to redo the algorithm and take care of all the missing entries and so on. It's easier to just transpose the matrix, compute the SVD of that, and recover the SVD of A from it.
So how do we get the SVD of a from the SVD of b?
So let's assume that B = U Sigma V^T. Then A = B^T, and transposing such a product means you transpose all the individual matrices but go from the back to the front. So V transposed transposed is V again, then Sigma transposed, then U transposed, which gives A = V Sigma^T U^T, and that is then a proper SVD of A.
So you can recover the SVD of a very easily from the SVD of b.
So we're going to compute the SVD of b so that we can use the algorithm that I showed
you in the lectures.
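If you want to verify this transpose trick numerically, a quick sketch using NumPy's built-in SVD (purely as a cross-check, not the hand algorithm from the lecture) could look like this:

```python
import numpy as np

A = np.ones((3, 2))
B = A.T  # 2 rows, 3 columns: at most as many rows as columns

# SVD of B: B = U @ Sigma @ Vt
U, s, Vt = np.linalg.svd(B)

# Recover A = B^T = V @ Sigma^T @ U^T
Sigma = np.zeros(B.shape)
Sigma[: len(s), : len(s)] = np.diag(s)
A_rec = Vt.T @ Sigma.T @ U.T

print(np.allclose(A, A_rec))  # True: the reconstruction matches A
print(s)                      # [2.449... 0.], i.e. sqrt(6) and 0
```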
Okay, so what's the first step?
First we have to compute B transposed times B, and we can do this very easily by hand. Every entry is one plus one, so B transposed B is the 3-by-3 matrix whose entries are all twos; nothing really happening here.
And the first task is finding all the eigenvalues and eigenvectors of this matrix.
So what are the eigenvalues?
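As a numerical sanity check (again just a sketch with NumPy, not the hand computation), you can eigendecompose this matrix directly:

```python
import numpy as np

B = np.ones((2, 3))  # B = A^T
BtB = B.T @ B        # the 3-by-3 matrix with every entry equal to 2

# B^T B is symmetric, so eigh applies; eigenvalues come back in
# ascending order together with orthonormal eigenvectors
eigvals, eigvecs = np.linalg.eigh(BtB)
print(eigvals)  # approximately [0, 0, 6]: one nonzero eigenvalue

# The singular values of B (and of A) are the square roots
print(np.sqrt(eigvals[eigvals > 1e-12]))  # [2.449...], i.e. sqrt(6)
```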